Face Generation

In this project, you'll use generative adversarial networks to generate new images of faces.

Get the Data

You'll be using two datasets in this project:

  • MNIST
  • CelebA

Since the celebA dataset is complex and you're doing GANs in a project for the first time, we want you to test your neural network on MNIST before CelebA. Running the GANs on MNIST will allow you to see how well your model trains sooner.

If you're using FloydHub, set data_dir to "/input" and use the FloydHub data ID "R5KrjnANiKVhLWAkpXhNBe".

In [1]:
!ls -al /input
total 8308
drwxr-xr-x   4 root root    6144 Apr 29 00:27 .
drwxr-xr-x 138 root root    4096 Aug 16 15:51 ..
drwxr-xr-x   2 root root 6137856 Apr 28 19:01 img_align_celeba
drwxr-xr-x   2 root root 2365440 Apr 28 18:57 mnist
In [2]:
data_dir = '/input'

# FloydHub - Use with data ID "R5KrjnANiKVhLWAkpXhNBe"
#data_dir = '/input'


"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import helper

helper.download_extract('mnist', data_dir)
helper.download_extract('celeba', data_dir)
Found mnist Data
Found celeba Data

Explore the Data

MNIST

As you're aware, the MNIST dataset contains images of handwritten digits. You can view the first number of examples by changing show_n_images.

In [3]:
show_n_images = 25

"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
%matplotlib inline
import os
from glob import glob
from matplotlib import pyplot

mnist_images = helper.get_batch(glob(os.path.join(data_dir, 'mnist/*.jpg'))[:show_n_images], 28, 28, 'L')
pyplot.imshow(helper.images_square_grid(mnist_images, 'L'), cmap='gray')
Out[3]:
<matplotlib.image.AxesImage at 0x7f8ff02ac668>

CelebA

The CelebFaces Attributes Dataset (CelebA) dataset contains over 200,000 celebrity images with annotations. Since you're going to be generating faces, you won't need the annotations. You can view the first number of examples by changing show_n_images.

In [4]:
show_n_images = 25

"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
mnist_images = helper.get_batch(glob(os.path.join(data_dir, 'img_align_celeba/*.jpg'))[:show_n_images], 28, 28, 'RGB')
pyplot.imshow(helper.images_square_grid(mnist_images, 'RGB'))
Out[4]:
<matplotlib.image.AxesImage at 0x7f8ff0229748>

Preprocess the Data

Since the project's main focus is on building the GANs, we'll preprocess the data for you. The values of the MNIST and CelebA dataset will be in the range of -0.5 to 0.5 of 28x28 dimensional images. The CelebA images will be cropped to remove parts of the image that don't include a face, then resized down to 28x28.

The MNIST images are black and white images with a single color channel while the CelebA images have 3 color channels (RGB color channel).

Build the Neural Network

You'll build the components necessary to build a GANs by implementing the following functions below:

  • model_inputs
  • discriminator
  • generator
  • model_loss
  • model_opt
  • train

Check the Version of TensorFlow and Access to GPU

This will check to make sure you have the correct version of TensorFlow and access to a GPU

In [5]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
from distutils.version import LooseVersion
import warnings
import tensorflow as tf

# Check TensorFlow Version
assert LooseVersion(tf.__version__) >= LooseVersion('1.0'), 'Please use TensorFlow version 1.0 or newer.  You are using {}'.format(tf.__version__)
print('TensorFlow Version: {}'.format(tf.__version__))

# Check for a GPU
if not tf.test.gpu_device_name():
    warnings.warn('No GPU found. Please use a GPU to train your neural network.')
else:
    print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))
TensorFlow Version: 1.1.0
Default GPU Device: /gpu:0

Input

Implement the model_inputs function to create TF Placeholders for the Neural Network. It should create the following placeholders:

  • Real input images placeholder with rank 4 using image_width, image_height, and image_channels.
  • Z input placeholder with rank 2 using z_dim.
  • Learning rate placeholder with rank 0.

Return the placeholders in the following the tuple (tensor of real input images, tensor of z data)

In [6]:
import problem_unittests as tests

def model_inputs(image_width, image_height, image_channels, z_dim):
    """
    Create the model inputs
    :param image_width: The input image width
    :param image_height: The input image height
    :param image_channels: The number of image channels
    :param z_dim: The dimension of Z
    :return: Tuple of (tensor of real input images, tensor of z data, learning rate)
    """
    
    real_inputs = tf.placeholder(
        tf.float32, 
        (None, image_height, image_width, image_channels),
        name='real_inputs'
    )
    z_inputs = tf.placeholder(tf.float32, (None, z_dim), name='z_inputs')
    lrate = tf.placeholder(tf.float32, name='lrate')
    return real_inputs, z_inputs, lrate


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_inputs(model_inputs)
Tests Passed

Discriminator

Implement discriminator to create a discriminator neural network that discriminates on images. This function should be able to reuse the variables in the neural network. Use tf.variable_scope with a scope name of "discriminator" to allow the variables to be reused. The function should return a tuple of (tensor output of the discriminator, tensor logits of the discriminator).

In [7]:
def discriminator(images, reuse=False, alpha=0.2, kernel=6, filters=32):
    """
    Create the discriminator network
    :param images: Tensor of input image(s)
    :param reuse: Boolean if the weights should be reused
    :return: Tuple of (tensor output of the discriminator, tensor logits of the discriminator)
    """
    # Input layer is 28x28x3
    with tf.variable_scope('discriminator', reuse=reuse):
        x1 = tf.layers.conv2d(images, filters, kernel, strides=2, padding='same')
        relu1 = tf.maximum(alpha * x1, x1)
        # 14x14x32
        
        x2 = tf.layers.conv2d(x1, filters*2, kernel, strides=2, padding='same')
        bn2 = tf.layers.batch_normalization(x2, training=True)
        relu2 = tf.maximum(alpha * bn2, bn2)
        # 7x7x64
        
        x3 = tf.layers.conv2d(x2, filters*2, kernel, strides=1, padding='same')
        bn3 = tf.layers.batch_normalization(x3, training=True)
        relu3 = tf.maximum(alpha * bn3, bn3)
        
        flat = tf.reshape(relu3, (-1, 7*7*filters*2))
        logits = tf.layers.dense(flat, 1)
        out = tf.sigmoid(logits)

    return out, logits


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_discriminator(discriminator, tf)
Tests Passed

Generator

Implement generator to generate an image using z. This function should be able to reuse the variables in the neural network. Use tf.variable_scope with a scope name of "generator" to allow the variables to be reused. The function should return the generated 28 x 28 x out_channel_dim images.

In [8]:
def generator(z, out_channel_dim, is_train=True, alpha=0.2, kernel=6):
    """
    Create the generator network
    :param z: Input z
    :param out_channel_dim: The number of channels in the output image
    :param is_train: Boolean if generator is being used for training
    :return: The tensor output of the generator
    """
    with tf.variable_scope('generator', reuse=not is_train):
        x1 = tf.layers.dense(z, 7*7*512)
        
        x1 = tf.reshape(x1, (-1, 7, 7, 512))
        x1 = tf.layers.batch_normalization(x1, training=is_train)
        x1 = tf.maximum(alpha * x1, x1)
        # 7x7x512
        
        x2 = tf.layers.conv2d_transpose(x1, 256, kernel, strides=2, padding='same')
        x2 = tf.layers.batch_normalization(x2, training=is_train)
        x2 = tf.maximum(alpha * x2, x2)
        # 14x14x256
        
        x3 = tf.layers.conv2d_transpose(x2, 128, kernel, strides=1, padding='same')
        x3 = tf.layers.batch_normalization(x3, training=is_train)
        x3 = tf.maximum(alpha * x3, x3)
        # 14x14x128
        
        logits = tf.layers.conv2d_transpose(
            x3, out_channel_dim, kernel, strides=2, padding='same')
        # 28x28x3
        
        out = tf.tanh(logits)
    return out


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_generator(generator, tf)
Tests Passed

Loss

Implement model_loss to build the GANs for training and calculate the loss. The function should return a tuple of (discriminator loss, generator loss). Use the following functions you implemented:

  • discriminator(images, reuse=False)
  • generator(z, out_channel_dim, is_train=True)
In [26]:
def model_loss(input_real, input_z, out_channel_dim):
    """
    Get the loss for the discriminator and generator
    :param input_real: Images from the real dataset
    :param input_z: Z input
    :param out_channel_dim: The number of channels in the output image
    :return: A tuple of (discriminator loss, generator loss)
    """
    gen_model = generator(input_z, out_channel_dim)
    d_model_real, d_logits_real = discriminator(input_real)
    d_model_fake, d_logits_fake = discriminator(gen_model, reuse=True)
    
    ones_like_real = tf.ones_like(d_model_real)
    one_sided_smooth_labels = tf.multiply(
        ones_like_real,
        tf.random_uniform((1,), minval=0.8, maxval=1.2)
    )

    
    d_loss_real = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(
            logits=d_logits_real, labels=one_sided_smooth_labels
        )
    )
    d_loss_fake = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(
            logits=d_logits_fake, labels=tf.zeros_like(d_model_fake)
        )
    )
    g_loss = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(
            logits=d_logits_fake, labels=tf.ones_like(d_model_fake)
        )
    )
    d_loss = d_loss_real + d_loss_fake
    
    return d_loss, g_loss


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_loss(model_loss)
Tests Passed

Optimization

Implement model_opt to create the optimization operations for the GANs. Use tf.trainable_variables to get all the trainable variables. Filter the variables with names that are in the discriminator and generator scope names. The function should return a tuple of (discriminator training operation, generator training operation).

In [27]:
def model_opt(d_loss, g_loss, learning_rate, beta1):
    """
    Get optimization operations
    :param d_loss: Discriminator loss Tensor
    :param g_loss: Generator loss Tensor
    :param learning_rate: Learning Rate Placeholder
    :param beta1: The exponential decay rate for the 1st moment in the optimizer
    :return: A tuple of (discriminator training operation, generator training operation)
    """
    # Get weights and bias to update
    t_vars = tf.trainable_variables()
    d_vars = [var for var in t_vars if var.name.startswith('discriminator')]
    g_vars = [var for var in t_vars if var.name.startswith('generator')]

    # Optimize
    with tf.control_dependencies(tf.get_collection(tf.GraphKeys.UPDATE_OPS)):
        d_train_opt = tf.train.AdamOptimizer(learning_rate, beta1=beta1).minimize(d_loss, var_list=d_vars)
        g_train_opt = tf.train.AdamOptimizer(learning_rate, beta1=beta1).minimize(g_loss, var_list=g_vars)

    return d_train_opt, g_train_opt


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_opt(model_opt, tf)
Tests Passed

Neural Network Training

Show Output

Use this function to show the current output of the generator during training. It will help you determine how well the GANs is training.

In [28]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import numpy as np

def show_generator_output(sess, n_images, input_z, out_channel_dim, image_mode):
    """
    Show example output for the generator
    :param sess: TensorFlow session
    :param n_images: Number of Images to display
    :param input_z: Input Z Tensor
    :param out_channel_dim: The number of channels in the output image
    :param image_mode: The mode to use for images ("RGB" or "L")
    """
    cmap = None if image_mode == 'RGB' else 'gray'
    z_dim = input_z.get_shape().as_list()[-1]
    example_z = np.random.uniform(-1, 1, size=[n_images, z_dim])

    samples = sess.run(
        generator(input_z, out_channel_dim, False),
        feed_dict={input_z: example_z})

    images_grid = helper.images_square_grid(samples, image_mode)
    pyplot.imshow(images_grid, cmap=cmap)
    pyplot.show()

Train

Implement train to build and train the GANs. Use the following functions you implemented:

  • model_inputs(image_width, image_height, image_channels, z_dim)
  • model_loss(input_real, input_z, out_channel_dim)
  • model_opt(d_loss, g_loss, learning_rate, beta1)

Use the show_generator_output to show generator output while you train. Running show_generator_output for every batch will drastically increase training time and increase the size of the notebook. It's recommended to print the generator output every 100 batches.

In [29]:
def train(epoch_count, batch_size, z_dim, learning_rate, beta1, get_batches, data_shape, data_image_mode):
    """
    Train the GAN
    :param epoch_count: Number of epochs
    :param batch_size: Batch Size
    :param z_dim: Z dimension
    :param learning_rate: Learning Rate
    :param beta1: The exponential decay rate for the 1st moment in the optimizer
    :param get_batches: Function to get batches
    :param data_shape: Shape of the data
    :param data_image_mode: The image mode to use for images ("RGB" or "L")
    """
    steps=0
    
    # TODO: Build Model
    image_channels = 3 if data_image_mode == 'RGB' else 1
    image_height, image_width = data_shape[1], data_shape[2]
    real_inputs, z_inputs, lrate = model_inputs(
        image_width, image_height, image_channels, z_dim)
        
    d_loss, g_loss = model_loss(real_inputs, z_inputs, image_channels)
    
    d_opt, g_opt = model_opt(d_loss, g_loss, lrate, beta1)
        
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for epoch_i in range(epoch_count):
            for batch_images in get_batches(batch_size):
                
                steps += 1
                batch_images = 2 * batch_images
                
                batch_z = np.random.uniform(-1 ,1, size=(batch_size, z_dim))
                
                _ = sess.run(d_opt, feed_dict={
                    real_inputs: batch_images,
                    z_inputs: batch_z,
                    lrate: learning_rate
                })
                
                # Double the number of trains to generator
                _ = sess.run(g_opt, feed_dict={
                    z_inputs: batch_z,
                    real_inputs: batch_images,
                    lrate: learning_rate
                })
                
                
                if steps % 10 == 0:
                    # At the end of every 10 epochs, get the losses and print them out
                    train_loss_d = d_loss.eval({z_inputs: batch_z, real_inputs: batch_images})
                    train_loss_g = g_loss.eval({z_inputs: batch_z})

                    print("Epoch {}/{}...".format(epoch_i+1, epochs),
                          "Discriminator Loss: {:.4f}...".format(train_loss_d),
                          "Generator Loss: {:.4f}".format(train_loss_g),
                          "Sum Loss: {:.4f}".format(train_loss_g+train_loss_d))
                
                if steps % 100 == 0:
                    show_generator_output(
                        sess,
                        25,
                        z_inputs,
                        image_channels,
                        data_image_mode
                    )
                  
        show_generator_output(sess, 25, z_inputs, image_channels, data_image_mode)

                
                

MNIST

Test your GANs architecture on MNIST. After 2 epochs, the GANs should be able to generate images that look like handwritten digits. Make sure the loss of the generator is lower than the loss of the discriminator or close to 0.

In [ ]:
 
In [30]:
batch_size = 64
z_dim = 100
learning_rate = 0.0002
beta1 = 0.5


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 2

mnist_dataset = helper.Dataset('mnist', glob(os.path.join(data_dir, 'mnist/*.jpg')))
with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, mnist_dataset.get_batches,
          mnist_dataset.shape, mnist_dataset.image_mode)
Epoch 1/2... Discriminator Loss: 1.2925... Generator Loss: 0.3550 Sum Loss: 1.6475
Epoch 1/2... Discriminator Loss: 1.3206... Generator Loss: 0.3101 Sum Loss: 1.6307
Epoch 1/2... Discriminator Loss: 1.7126... Generator Loss: 0.2774 Sum Loss: 1.9900
Epoch 1/2... Discriminator Loss: 0.0055... Generator Loss: 1.7671 Sum Loss: 1.7726
Epoch 1/2... Discriminator Loss: 0.8428... Generator Loss: 0.7786 Sum Loss: 1.6214
Epoch 1/2... Discriminator Loss: 1.1911... Generator Loss: 0.8550 Sum Loss: 2.0461
Epoch 1/2... Discriminator Loss: 1.0019... Generator Loss: 0.9711 Sum Loss: 1.9730
Epoch 1/2... Discriminator Loss: 1.6527... Generator Loss: 0.4028 Sum Loss: 2.0556
Epoch 1/2... Discriminator Loss: 1.2708... Generator Loss: 0.8797 Sum Loss: 2.1505
Epoch 1/2... Discriminator Loss: 1.6612... Generator Loss: 0.5809 Sum Loss: 2.2421
Epoch 1/2... Discriminator Loss: 1.6230... Generator Loss: 0.8484 Sum Loss: 2.4714
Epoch 1/2... Discriminator Loss: 1.3883... Generator Loss: 0.7248 Sum Loss: 2.1131
Epoch 1/2... Discriminator Loss: 1.4354... Generator Loss: 0.8221 Sum Loss: 2.2574
Epoch 1/2... Discriminator Loss: 1.6632... Generator Loss: 0.6784 Sum Loss: 2.3416
Epoch 1/2... Discriminator Loss: 1.4239... Generator Loss: 0.5896 Sum Loss: 2.0135
Epoch 1/2... Discriminator Loss: 1.4503... Generator Loss: 0.7506 Sum Loss: 2.2009
Epoch 1/2... Discriminator Loss: 1.6495... Generator Loss: 0.6561 Sum Loss: 2.3056
Epoch 1/2... Discriminator Loss: 1.6453... Generator Loss: 0.5243 Sum Loss: 2.1696
Epoch 1/2... Discriminator Loss: 1.4873... Generator Loss: 0.5578 Sum Loss: 2.0452
Epoch 1/2... Discriminator Loss: 1.3258... Generator Loss: 0.7763 Sum Loss: 2.1020
Epoch 1/2... Discriminator Loss: 1.5860... Generator Loss: 0.6580 Sum Loss: 2.2440
Epoch 1/2... Discriminator Loss: 1.3772... Generator Loss: 0.6977 Sum Loss: 2.0749
Epoch 1/2... Discriminator Loss: 1.6233... Generator Loss: 0.5082 Sum Loss: 2.1315
Epoch 1/2... Discriminator Loss: 1.6879... Generator Loss: 0.6000 Sum Loss: 2.2879
Epoch 1/2... Discriminator Loss: 1.3592... Generator Loss: 0.8989 Sum Loss: 2.2581
Epoch 1/2... Discriminator Loss: 1.5938... Generator Loss: 0.5077 Sum Loss: 2.1015
Epoch 1/2... Discriminator Loss: 1.6904... Generator Loss: 0.7000 Sum Loss: 2.3905
Epoch 1/2... Discriminator Loss: 1.7100... Generator Loss: 0.4997 Sum Loss: 2.2097
Epoch 1/2... Discriminator Loss: 1.6668... Generator Loss: 0.7042 Sum Loss: 2.3709
Epoch 1/2... Discriminator Loss: 1.6749... Generator Loss: 0.5410 Sum Loss: 2.2158
Epoch 1/2... Discriminator Loss: 1.5907... Generator Loss: 0.6905 Sum Loss: 2.2812
Epoch 1/2... Discriminator Loss: 1.6607... Generator Loss: 0.4649 Sum Loss: 2.1256
Epoch 1/2... Discriminator Loss: 1.4800... Generator Loss: 0.8015 Sum Loss: 2.2815
Epoch 1/2... Discriminator Loss: 1.6143... Generator Loss: 0.4945 Sum Loss: 2.1088
Epoch 1/2... Discriminator Loss: 1.5091... Generator Loss: 0.8123 Sum Loss: 2.3214
Epoch 1/2... Discriminator Loss: 1.4568... Generator Loss: 0.7085 Sum Loss: 2.1652
Epoch 1/2... Discriminator Loss: 1.5834... Generator Loss: 0.6330 Sum Loss: 2.2164
Epoch 1/2... Discriminator Loss: 1.4889... Generator Loss: 0.5415 Sum Loss: 2.0304
Epoch 1/2... Discriminator Loss: 1.4739... Generator Loss: 0.7133 Sum Loss: 2.1873
Epoch 1/2... Discriminator Loss: 1.5791... Generator Loss: 0.6679 Sum Loss: 2.2470
Epoch 1/2... Discriminator Loss: 1.4583... Generator Loss: 0.6587 Sum Loss: 2.1170
Epoch 1/2... Discriminator Loss: 1.4274... Generator Loss: 0.6726 Sum Loss: 2.1000
Epoch 1/2... Discriminator Loss: 1.4957... Generator Loss: 0.7119 Sum Loss: 2.2077
Epoch 1/2... Discriminator Loss: 1.4424... Generator Loss: 0.7528 Sum Loss: 2.1952
Epoch 1/2... Discriminator Loss: 1.5061... Generator Loss: 0.6568 Sum Loss: 2.1629
Epoch 1/2... Discriminator Loss: 1.4503... Generator Loss: 0.6136 Sum Loss: 2.0639
Epoch 1/2... Discriminator Loss: 1.4006... Generator Loss: 0.6991 Sum Loss: 2.0997
Epoch 1/2... Discriminator Loss: 1.4940... Generator Loss: 0.6880 Sum Loss: 2.1820
Epoch 1/2... Discriminator Loss: 1.5057... Generator Loss: 0.5718 Sum Loss: 2.0775
Epoch 1/2... Discriminator Loss: 1.4659... Generator Loss: 0.7484 Sum Loss: 2.2143
Epoch 1/2... Discriminator Loss: 1.4309... Generator Loss: 0.7202 Sum Loss: 2.1511
Epoch 1/2... Discriminator Loss: 1.4885... Generator Loss: 0.5562 Sum Loss: 2.0447
Epoch 1/2... Discriminator Loss: 1.5051... Generator Loss: 0.7252 Sum Loss: 2.2303
Epoch 1/2... Discriminator Loss: 1.4253... Generator Loss: 0.6915 Sum Loss: 2.1168
Epoch 1/2... Discriminator Loss: 1.4140... Generator Loss: 0.7373 Sum Loss: 2.1512
Epoch 1/2... Discriminator Loss: 1.4264... Generator Loss: 0.6872 Sum Loss: 2.1136
Epoch 1/2... Discriminator Loss: 1.5055... Generator Loss: 0.5203 Sum Loss: 2.0258
Epoch 1/2... Discriminator Loss: 1.4335... Generator Loss: 0.6331 Sum Loss: 2.0666
Epoch 1/2... Discriminator Loss: 1.4394... Generator Loss: 0.6386 Sum Loss: 2.0780
Epoch 1/2... Discriminator Loss: 1.4571... Generator Loss: 0.7393 Sum Loss: 2.1964
Epoch 1/2... Discriminator Loss: 1.4525... Generator Loss: 0.7718 Sum Loss: 2.2244
Epoch 1/2... Discriminator Loss: 1.4052... Generator Loss: 0.6816 Sum Loss: 2.0868
Epoch 1/2... Discriminator Loss: 1.4474... Generator Loss: 0.7717 Sum Loss: 2.2191
Epoch 1/2... Discriminator Loss: 1.4365... Generator Loss: 0.7267 Sum Loss: 2.1633
Epoch 1/2... Discriminator Loss: 1.4602... Generator Loss: 0.5399 Sum Loss: 2.0001
Epoch 1/2... Discriminator Loss: 1.4167... Generator Loss: 0.6337 Sum Loss: 2.0503
Epoch 1/2... Discriminator Loss: 1.4761... Generator Loss: 0.4873 Sum Loss: 1.9634
Epoch 1/2... Discriminator Loss: 1.3948... Generator Loss: 0.7367 Sum Loss: 2.1315
Epoch 1/2... Discriminator Loss: 1.4036... Generator Loss: 0.7350 Sum Loss: 2.1386
Epoch 1/2... Discriminator Loss: 1.4116... Generator Loss: 0.8113 Sum Loss: 2.2229
Epoch 1/2... Discriminator Loss: 1.4371... Generator Loss: 0.5971 Sum Loss: 2.0342
Epoch 1/2... Discriminator Loss: 1.3973... Generator Loss: 0.7140 Sum Loss: 2.1113
Epoch 1/2... Discriminator Loss: 1.4542... Generator Loss: 0.7286 Sum Loss: 2.1828
Epoch 1/2... Discriminator Loss: 1.4025... Generator Loss: 0.6575 Sum Loss: 2.0600
Epoch 1/2... Discriminator Loss: 1.4190... Generator Loss: 0.5748 Sum Loss: 1.9939
Epoch 1/2... Discriminator Loss: 1.4023... Generator Loss: 0.7621 Sum Loss: 2.1644
Epoch 1/2... Discriminator Loss: 1.4367... Generator Loss: 0.6080 Sum Loss: 2.0447
Epoch 1/2... Discriminator Loss: 1.4221... Generator Loss: 0.7806 Sum Loss: 2.2026
Epoch 1/2... Discriminator Loss: 1.3739... Generator Loss: 0.6084 Sum Loss: 1.9822
Epoch 1/2... Discriminator Loss: 1.4213... Generator Loss: 0.6995 Sum Loss: 2.1208
Epoch 1/2... Discriminator Loss: 1.4076... Generator Loss: 0.9039 Sum Loss: 2.3115
Epoch 1/2... Discriminator Loss: 1.4225... Generator Loss: 0.6766 Sum Loss: 2.0991
Epoch 1/2... Discriminator Loss: 1.3858... Generator Loss: 0.6803 Sum Loss: 2.0661
Epoch 1/2... Discriminator Loss: 1.5165... Generator Loss: 0.9311 Sum Loss: 2.4475
Epoch 1/2... Discriminator Loss: 1.4224... Generator Loss: 0.6517 Sum Loss: 2.0742
Epoch 1/2... Discriminator Loss: 1.3890... Generator Loss: 0.7554 Sum Loss: 2.1444
Epoch 1/2... Discriminator Loss: 1.4155... Generator Loss: 0.5885 Sum Loss: 2.0040
Epoch 1/2... Discriminator Loss: 1.4078... Generator Loss: 0.7331 Sum Loss: 2.1409
Epoch 1/2... Discriminator Loss: 1.3759... Generator Loss: 0.8556 Sum Loss: 2.2315
Epoch 1/2... Discriminator Loss: 1.3711... Generator Loss: 0.7627 Sum Loss: 2.1339
Epoch 1/2... Discriminator Loss: 1.3716... Generator Loss: 0.8739 Sum Loss: 2.2455
Epoch 1/2... Discriminator Loss: 1.3706... Generator Loss: 0.6397 Sum Loss: 2.0102
Epoch 1/2... Discriminator Loss: 1.4063... Generator Loss: 0.6238 Sum Loss: 2.0301
Epoch 2/2... Discriminator Loss: 1.4260... Generator Loss: 0.7341 Sum Loss: 2.1601
Epoch 2/2... Discriminator Loss: 1.4095... Generator Loss: 0.6398 Sum Loss: 2.0493
Epoch 2/2... Discriminator Loss: 1.3755... Generator Loss: 0.6186 Sum Loss: 1.9941
Epoch 2/2... Discriminator Loss: 1.3675... Generator Loss: 0.9417 Sum Loss: 2.3092
Epoch 2/2... Discriminator Loss: 1.4163... Generator Loss: 0.6700 Sum Loss: 2.0863
Epoch 2/2... Discriminator Loss: 1.3636... Generator Loss: 0.7967 Sum Loss: 2.1603
Epoch 2/2... Discriminator Loss: 1.5269... Generator Loss: 0.9293 Sum Loss: 2.4562
Epoch 2/2... Discriminator Loss: 1.3852... Generator Loss: 0.8008 Sum Loss: 2.1860
Epoch 2/2... Discriminator Loss: 1.4183... Generator Loss: 0.7630 Sum Loss: 2.1813
Epoch 2/2... Discriminator Loss: 1.4239... Generator Loss: 0.6640 Sum Loss: 2.0880
Epoch 2/2... Discriminator Loss: 1.3490... Generator Loss: 0.8133 Sum Loss: 2.1623
Epoch 2/2... Discriminator Loss: 1.3478... Generator Loss: 0.8175 Sum Loss: 2.1653
Epoch 2/2... Discriminator Loss: 1.3820... Generator Loss: 0.8752 Sum Loss: 2.2572
Epoch 2/2... Discriminator Loss: 1.4766... Generator Loss: 0.8949 Sum Loss: 2.3716
Epoch 2/2... Discriminator Loss: 1.3778... Generator Loss: 0.8609 Sum Loss: 2.2386
Epoch 2/2... Discriminator Loss: 1.3960... Generator Loss: 0.4469 Sum Loss: 1.8429
Epoch 2/2... Discriminator Loss: 1.4174... Generator Loss: 0.6683 Sum Loss: 2.0857
Epoch 2/2... Discriminator Loss: 1.3917... Generator Loss: 0.6142 Sum Loss: 2.0059
Epoch 2/2... Discriminator Loss: 1.4071... Generator Loss: 0.8923 Sum Loss: 2.2994
Epoch 2/2... Discriminator Loss: 1.3828... Generator Loss: 0.7618 Sum Loss: 2.1446
Epoch 2/2... Discriminator Loss: 1.3867... Generator Loss: 0.6412 Sum Loss: 2.0278
Epoch 2/2... Discriminator Loss: 1.4121... Generator Loss: 0.6880 Sum Loss: 2.1001
Epoch 2/2... Discriminator Loss: 1.3903... Generator Loss: 0.7233 Sum Loss: 2.1135
Epoch 2/2... Discriminator Loss: 1.3893... Generator Loss: 0.6908 Sum Loss: 2.0801
Epoch 2/2... Discriminator Loss: 1.5657... Generator Loss: 1.0551 Sum Loss: 2.6208
Epoch 2/2... Discriminator Loss: 1.3986... Generator Loss: 0.4666 Sum Loss: 1.8652
Epoch 2/2... Discriminator Loss: 1.3736... Generator Loss: 0.6811 Sum Loss: 2.0546
Epoch 2/2... Discriminator Loss: 1.4330... Generator Loss: 0.7906 Sum Loss: 2.2235
Epoch 2/2... Discriminator Loss: 1.3968... Generator Loss: 0.6335 Sum Loss: 2.0304
Epoch 2/2... Discriminator Loss: 1.4201... Generator Loss: 0.7433 Sum Loss: 2.1635
Epoch 2/2... Discriminator Loss: 1.4421... Generator Loss: 0.8431 Sum Loss: 2.2852
Epoch 2/2... Discriminator Loss: 1.3795... Generator Loss: 0.6811 Sum Loss: 2.0606
Epoch 2/2... Discriminator Loss: 1.5480... Generator Loss: 0.4785 Sum Loss: 2.0265
Epoch 2/2... Discriminator Loss: 1.4124... Generator Loss: 0.6427 Sum Loss: 2.0551
Epoch 2/2... Discriminator Loss: 1.4038... Generator Loss: 0.7263 Sum Loss: 2.1301
Epoch 2/2... Discriminator Loss: 1.4395... Generator Loss: 0.5925 Sum Loss: 2.0320
Epoch 2/2... Discriminator Loss: 1.3429... Generator Loss: 0.4723 Sum Loss: 1.8151
Epoch 2/2... Discriminator Loss: 1.4004... Generator Loss: 0.9700 Sum Loss: 2.3703
Epoch 2/2... Discriminator Loss: 1.4511... Generator Loss: 0.7407 Sum Loss: 2.1918
Epoch 2/2... Discriminator Loss: 1.3727... Generator Loss: 0.7705 Sum Loss: 2.1432
Epoch 2/2... Discriminator Loss: 1.5306... Generator Loss: 0.9436 Sum Loss: 2.4741
Epoch 2/2... Discriminator Loss: 1.4150... Generator Loss: 0.7178 Sum Loss: 2.1328
Epoch 2/2... Discriminator Loss: 1.4441... Generator Loss: 0.5923 Sum Loss: 2.0364
Epoch 2/2... Discriminator Loss: 1.3735... Generator Loss: 0.7182 Sum Loss: 2.0917
Epoch 2/2... Discriminator Loss: 1.4631... Generator Loss: 0.5582 Sum Loss: 2.0213
Epoch 2/2... Discriminator Loss: 1.4240... Generator Loss: 0.9012 Sum Loss: 2.3252
Epoch 2/2... Discriminator Loss: 1.4037... Generator Loss: 0.6494 Sum Loss: 2.0531
Epoch 2/2... Discriminator Loss: 1.4125... Generator Loss: 0.7312 Sum Loss: 2.1437
Epoch 2/2... Discriminator Loss: 1.3934... Generator Loss: 0.8807 Sum Loss: 2.2741
Epoch 2/2... Discriminator Loss: 1.4792... Generator Loss: 0.8923 Sum Loss: 2.3715
Epoch 2/2... Discriminator Loss: 1.4062... Generator Loss: 0.6286 Sum Loss: 2.0347
Epoch 2/2... Discriminator Loss: 1.4036... Generator Loss: 0.7700 Sum Loss: 2.1736
Epoch 2/2... Discriminator Loss: 1.3923... Generator Loss: 0.7732 Sum Loss: 2.1655
Epoch 2/2... Discriminator Loss: 1.3863... Generator Loss: 0.7275 Sum Loss: 2.1138
Epoch 2/2... Discriminator Loss: 1.3923... Generator Loss: 0.7810 Sum Loss: 2.1733
Epoch 2/2... Discriminator Loss: 1.3976... Generator Loss: 0.6637 Sum Loss: 2.0614
Epoch 2/2... Discriminator Loss: 1.3789... Generator Loss: 0.5658 Sum Loss: 1.9447
Epoch 2/2... Discriminator Loss: 1.3644... Generator Loss: 0.7002 Sum Loss: 2.0645
Epoch 2/2... Discriminator Loss: 1.4049... Generator Loss: 0.6149 Sum Loss: 2.0198
Epoch 2/2... Discriminator Loss: 1.3876... Generator Loss: 0.7634 Sum Loss: 2.1510
Epoch 2/2... Discriminator Loss: 1.4291... Generator Loss: 0.8302 Sum Loss: 2.2593
Epoch 2/2... Discriminator Loss: 1.3616... Generator Loss: 0.8412 Sum Loss: 2.2028
Epoch 2/2... Discriminator Loss: 1.4192... Generator Loss: 0.8261 Sum Loss: 2.2454
Epoch 2/2... Discriminator Loss: 1.4106... Generator Loss: 0.6087 Sum Loss: 2.0193
Epoch 2/2... Discriminator Loss: 1.3940... Generator Loss: 0.7298 Sum Loss: 2.1237
Epoch 2/2... Discriminator Loss: 1.3881... Generator Loss: 0.7570 Sum Loss: 2.1451
Epoch 2/2... Discriminator Loss: 1.4065... Generator Loss: 0.6321 Sum Loss: 2.0385
Epoch 2/2... Discriminator Loss: 1.4116... Generator Loss: 0.8582 Sum Loss: 2.2698
Epoch 2/2... Discriminator Loss: 1.3502... Generator Loss: 0.6179 Sum Loss: 1.9682
Epoch 2/2... Discriminator Loss: 1.3970... Generator Loss: 0.7526 Sum Loss: 2.1496
Epoch 2/2... Discriminator Loss: 1.4266... Generator Loss: 0.7461 Sum Loss: 2.1727
Epoch 2/2... Discriminator Loss: 1.4118... Generator Loss: 0.6611 Sum Loss: 2.0730
Epoch 2/2... Discriminator Loss: 1.3983... Generator Loss: 0.6890 Sum Loss: 2.0874
Epoch 2/2... Discriminator Loss: 1.3792... Generator Loss: 0.8627 Sum Loss: 2.2420
Epoch 2/2... Discriminator Loss: 1.3802... Generator Loss: 0.8287 Sum Loss: 2.2089
Epoch 2/2... Discriminator Loss: 1.3730... Generator Loss: 0.6078 Sum Loss: 1.9807
Epoch 2/2... Discriminator Loss: 1.3785... Generator Loss: 0.5241 Sum Loss: 1.9026
Epoch 2/2... Discriminator Loss: 1.3897... Generator Loss: 0.6370 Sum Loss: 2.0267
Epoch 2/2... Discriminator Loss: 1.3995... Generator Loss: 0.6515 Sum Loss: 2.0510
Epoch 2/2... Discriminator Loss: 1.3361... Generator Loss: 0.5106 Sum Loss: 1.8467
Epoch 2/2... Discriminator Loss: 1.3642... Generator Loss: 0.6673 Sum Loss: 2.0315
Epoch 2/2... Discriminator Loss: 1.4263... Generator Loss: 0.7128 Sum Loss: 2.1391
Epoch 2/2... Discriminator Loss: 1.4245... Generator Loss: 0.6036 Sum Loss: 2.0280
Epoch 2/2... Discriminator Loss: 1.3931... Generator Loss: 0.7344 Sum Loss: 2.1275
Epoch 2/2... Discriminator Loss: 1.3484... Generator Loss: 0.7740 Sum Loss: 2.1225
Epoch 2/2... Discriminator Loss: 1.4132... Generator Loss: 0.6784 Sum Loss: 2.0915
Epoch 2/2... Discriminator Loss: 1.4080... Generator Loss: 0.7458 Sum Loss: 2.1538
Epoch 2/2... Discriminator Loss: 1.4006... Generator Loss: 0.6281 Sum Loss: 2.0287
Epoch 2/2... Discriminator Loss: 1.3919... Generator Loss: 0.5511 Sum Loss: 1.9430
Epoch 2/2... Discriminator Loss: 1.3951... Generator Loss: 0.5591 Sum Loss: 1.9542
Epoch 2/2... Discriminator Loss: 1.3675... Generator Loss: 0.8223 Sum Loss: 2.1898
Epoch 2/2... Discriminator Loss: 1.4009... Generator Loss: 0.7485 Sum Loss: 2.1494
Epoch 2/2... Discriminator Loss: 1.3918... Generator Loss: 0.6965 Sum Loss: 2.0882
Epoch 2/2... Discriminator Loss: 1.3640... Generator Loss: 0.6371 Sum Loss: 2.0011

CelebA

Run your GANs on CelebA. It will take around 20 minutes on the average GPU to run one epoch. You can run the whole epoch or stop when it starts to generate realistic faces.

In [31]:
batch_size = 64
z_dim = 100
learning_rate = 0.0002
beta1 = 0.5


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 1

celeba_dataset = helper.Dataset('celeba', glob(os.path.join(data_dir, 'img_align_celeba/*.jpg')))
with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, celeba_dataset.get_batches,
          celeba_dataset.shape, celeba_dataset.image_mode)
Epoch 1/1... Discriminator Loss: 5.6191... Generator Loss: 0.0075 Sum Loss: 5.6267
Epoch 1/1... Discriminator Loss: 4.1345... Generator Loss: 0.0274 Sum Loss: 4.1619
Epoch 1/1... Discriminator Loss: 3.7508... Generator Loss: 0.0340 Sum Loss: 3.7848
Epoch 1/1... Discriminator Loss: 2.0535... Generator Loss: 0.2226 Sum Loss: 2.2762
Epoch 1/1... Discriminator Loss: 2.2461... Generator Loss: 0.2843 Sum Loss: 2.5305
Epoch 1/1... Discriminator Loss: 2.7773... Generator Loss: 0.1428 Sum Loss: 2.9200
Epoch 1/1... Discriminator Loss: 1.8134... Generator Loss: 0.6277 Sum Loss: 2.4411
Epoch 1/1... Discriminator Loss: 1.8538... Generator Loss: 0.4069 Sum Loss: 2.2607
Epoch 1/1... Discriminator Loss: 2.0317... Generator Loss: 0.3655 Sum Loss: 2.3972
Epoch 1/1... Discriminator Loss: 1.1558... Generator Loss: 1.0301 Sum Loss: 2.1859
Epoch 1/1... Discriminator Loss: 2.1457... Generator Loss: 0.5465 Sum Loss: 2.6922
Epoch 1/1... Discriminator Loss: 1.7252... Generator Loss: 0.6638 Sum Loss: 2.3891
Epoch 1/1... Discriminator Loss: 1.5023... Generator Loss: 0.7645 Sum Loss: 2.2667
Epoch 1/1... Discriminator Loss: 1.9954... Generator Loss: 0.5628 Sum Loss: 2.5581
Epoch 1/1... Discriminator Loss: 1.4463... Generator Loss: 0.6247 Sum Loss: 2.0710
Epoch 1/1... Discriminator Loss: 1.6861... Generator Loss: 0.5898 Sum Loss: 2.2759
Epoch 1/1... Discriminator Loss: 1.6860... Generator Loss: 0.8972 Sum Loss: 2.5832
Epoch 1/1... Discriminator Loss: 1.6985... Generator Loss: 0.7138 Sum Loss: 2.4123
Epoch 1/1... Discriminator Loss: 1.8391... Generator Loss: 0.3798 Sum Loss: 2.2189
Epoch 1/1... Discriminator Loss: 1.7690... Generator Loss: 0.4899 Sum Loss: 2.2589
Epoch 1/1... Discriminator Loss: 1.3170... Generator Loss: 0.8159 Sum Loss: 2.1329
Epoch 1/1... Discriminator Loss: 1.3704... Generator Loss: 0.8173 Sum Loss: 2.1877
Epoch 1/1... Discriminator Loss: 1.9513... Generator Loss: 0.3060 Sum Loss: 2.2574
Epoch 1/1... Discriminator Loss: 1.8737... Generator Loss: 0.4488 Sum Loss: 2.3224
Epoch 1/1... Discriminator Loss: 1.4629... Generator Loss: 0.7527 Sum Loss: 2.2155
Epoch 1/1... Discriminator Loss: 1.3341... Generator Loss: 0.7041 Sum Loss: 2.0381
Epoch 1/1... Discriminator Loss: 1.4400... Generator Loss: 0.6672 Sum Loss: 2.1072
Epoch 1/1... Discriminator Loss: 1.0926... Generator Loss: 1.2687 Sum Loss: 2.3613
Epoch 1/1... Discriminator Loss: 1.3684... Generator Loss: 0.7094 Sum Loss: 2.0778
Epoch 1/1... Discriminator Loss: 1.0558... Generator Loss: 1.0587 Sum Loss: 2.1145
Epoch 1/1... Discriminator Loss: 1.7379... Generator Loss: 0.7309 Sum Loss: 2.4688
Epoch 1/1... Discriminator Loss: 1.3243... Generator Loss: 1.1589 Sum Loss: 2.4832
Epoch 1/1... Discriminator Loss: 1.6602... Generator Loss: 0.4371 Sum Loss: 2.0974
Epoch 1/1... Discriminator Loss: 1.0050... Generator Loss: 1.2548 Sum Loss: 2.2598
Epoch 1/1... Discriminator Loss: 1.4143... Generator Loss: 0.6642 Sum Loss: 2.0785
Epoch 1/1... Discriminator Loss: 0.9202... Generator Loss: 1.4894 Sum Loss: 2.4095
Epoch 1/1... Discriminator Loss: 1.5454... Generator Loss: 0.4025 Sum Loss: 1.9478
Epoch 1/1... Discriminator Loss: 0.9223... Generator Loss: 1.9424 Sum Loss: 2.8646
Epoch 1/1... Discriminator Loss: 1.2400... Generator Loss: 0.9009 Sum Loss: 2.1409
Epoch 1/1... Discriminator Loss: 0.9846... Generator Loss: 1.0298 Sum Loss: 2.0144
Epoch 1/1... Discriminator Loss: 0.7661... Generator Loss: 2.0300 Sum Loss: 2.7962
Epoch 1/1... Discriminator Loss: 0.7297... Generator Loss: 2.7041 Sum Loss: 3.4338
Epoch 1/1... Discriminator Loss: 0.7159... Generator Loss: 1.5932 Sum Loss: 2.3091
Epoch 1/1... Discriminator Loss: 0.6386... Generator Loss: 1.8312 Sum Loss: 2.4698
Epoch 1/1... Discriminator Loss: 0.6071... Generator Loss: 2.3887 Sum Loss: 2.9958
Epoch 1/1... Discriminator Loss: 0.8869... Generator Loss: 1.2814 Sum Loss: 2.1683
Epoch 1/1... Discriminator Loss: 0.5532... Generator Loss: 1.8148 Sum Loss: 2.3680
Epoch 1/1... Discriminator Loss: 0.9026... Generator Loss: 2.2062 Sum Loss: 3.1088
Epoch 1/1... Discriminator Loss: 0.4321... Generator Loss: 1.8131 Sum Loss: 2.2452
Epoch 1/1... Discriminator Loss: 0.6346... Generator Loss: 2.5438 Sum Loss: 3.1783
Epoch 1/1... Discriminator Loss: 0.5347... Generator Loss: 2.2599 Sum Loss: 2.7946
Epoch 1/1... Discriminator Loss: 0.3407... Generator Loss: 2.9611 Sum Loss: 3.3018
Epoch 1/1... Discriminator Loss: 0.5836... Generator Loss: 1.9097 Sum Loss: 2.4934
Epoch 1/1... Discriminator Loss: -0.1143... Generator Loss: 3.5470 Sum Loss: 3.4327
Epoch 1/1... Discriminator Loss: 0.5683... Generator Loss: 1.6998 Sum Loss: 2.2680
Epoch 1/1... Discriminator Loss: 0.6808... Generator Loss: 1.7262 Sum Loss: 2.4070
Epoch 1/1... Discriminator Loss: 0.8529... Generator Loss: 3.1596 Sum Loss: 4.0125
Epoch 1/1... Discriminator Loss: 0.6980... Generator Loss: 2.3966 Sum Loss: 3.0947
Epoch 1/1... Discriminator Loss: 0.5342... Generator Loss: 1.7329 Sum Loss: 2.2671
Epoch 1/1... Discriminator Loss: 0.6481... Generator Loss: 1.0367 Sum Loss: 1.6848
Epoch 1/1... Discriminator Loss: 1.0677... Generator Loss: 0.8669 Sum Loss: 1.9346
Epoch 1/1... Discriminator Loss: 0.5729... Generator Loss: 2.8528 Sum Loss: 3.4257
Epoch 1/1... Discriminator Loss: 0.4617... Generator Loss: 2.2364 Sum Loss: 2.6982
Epoch 1/1... Discriminator Loss: 0.9119... Generator Loss: 0.6004 Sum Loss: 1.5123
Epoch 1/1... Discriminator Loss: 0.7313... Generator Loss: 1.2768 Sum Loss: 2.0081
Epoch 1/1... Discriminator Loss: 0.6644... Generator Loss: 1.5278 Sum Loss: 2.1922
Epoch 1/1... Discriminator Loss: 1.2255... Generator Loss: 0.7045 Sum Loss: 1.9300
Epoch 1/1... Discriminator Loss: 0.6990... Generator Loss: 1.3624 Sum Loss: 2.0614
Epoch 1/1... Discriminator Loss: 0.9723... Generator Loss: 1.2886 Sum Loss: 2.2609
Epoch 1/1... Discriminator Loss: 1.0751... Generator Loss: 1.3769 Sum Loss: 2.4520
Epoch 1/1... Discriminator Loss: 0.7258... Generator Loss: 2.1909 Sum Loss: 2.9167
Epoch 1/1... Discriminator Loss: 0.9297... Generator Loss: 1.6589 Sum Loss: 2.5886
Epoch 1/1... Discriminator Loss: 0.5797... Generator Loss: 1.3679 Sum Loss: 1.9476
Epoch 1/1... Discriminator Loss: 0.9952... Generator Loss: 0.6750 Sum Loss: 1.6703
Epoch 1/1... Discriminator Loss: 1.2093... Generator Loss: 0.5823 Sum Loss: 1.7915
Epoch 1/1... Discriminator Loss: 1.5277... Generator Loss: 0.4728 Sum Loss: 2.0005
Epoch 1/1... Discriminator Loss: 1.4701... Generator Loss: 0.2944 Sum Loss: 1.7645
Epoch 1/1... Discriminator Loss: 0.7175... Generator Loss: 2.0052 Sum Loss: 2.7228
Epoch 1/1... Discriminator Loss: 0.6969... Generator Loss: 1.6049 Sum Loss: 2.3018
Epoch 1/1... Discriminator Loss: 0.8482... Generator Loss: 0.8024 Sum Loss: 1.6506
Epoch 1/1... Discriminator Loss: 1.5054... Generator Loss: 0.3768 Sum Loss: 1.8822
Epoch 1/1... Discriminator Loss: 0.8079... Generator Loss: 1.7485 Sum Loss: 2.5564
Epoch 1/1... Discriminator Loss: 0.9628... Generator Loss: 1.2967 Sum Loss: 2.2594
Epoch 1/1... Discriminator Loss: 1.3723... Generator Loss: 0.3916 Sum Loss: 1.7639
Epoch 1/1... Discriminator Loss: 1.1522... Generator Loss: 0.7809 Sum Loss: 1.9332
Epoch 1/1... Discriminator Loss: 2.1878... Generator Loss: 0.7869 Sum Loss: 2.9747
Epoch 1/1... Discriminator Loss: 0.8965... Generator Loss: 1.3675 Sum Loss: 2.2640
Epoch 1/1... Discriminator Loss: 0.7133... Generator Loss: 1.0124 Sum Loss: 1.7258
Epoch 1/1... Discriminator Loss: 2.2364... Generator Loss: 1.4449 Sum Loss: 3.6813
Epoch 1/1... Discriminator Loss: 0.7847... Generator Loss: 1.4972 Sum Loss: 2.2820
Epoch 1/1... Discriminator Loss: 1.0753... Generator Loss: 0.7698 Sum Loss: 1.8451
Epoch 1/1... Discriminator Loss: 0.8157... Generator Loss: 1.7470 Sum Loss: 2.5627
Epoch 1/1... Discriminator Loss: 0.5807... Generator Loss: 2.6187 Sum Loss: 3.1994
Epoch 1/1... Discriminator Loss: 1.1775... Generator Loss: 0.7722 Sum Loss: 1.9498
Epoch 1/1... Discriminator Loss: 0.8575... Generator Loss: 0.7269 Sum Loss: 1.5843
Epoch 1/1... Discriminator Loss: 0.4756... Generator Loss: 1.2769 Sum Loss: 1.7526
Epoch 1/1... Discriminator Loss: 0.5497... Generator Loss: 2.8639 Sum Loss: 3.4136
Epoch 1/1... Discriminator Loss: 1.3594... Generator Loss: 1.4925 Sum Loss: 2.8519
Epoch 1/1... Discriminator Loss: 1.8838... Generator Loss: 0.2682 Sum Loss: 2.1520
Epoch 1/1... Discriminator Loss: 1.1251... Generator Loss: 0.7105 Sum Loss: 1.8355
Epoch 1/1... Discriminator Loss: 2.0397... Generator Loss: 0.2770 Sum Loss: 2.3167
Epoch 1/1... Discriminator Loss: 0.6721... Generator Loss: 0.9947 Sum Loss: 1.6668
Epoch 1/1... Discriminator Loss: 1.1216... Generator Loss: 0.7954 Sum Loss: 1.9170
Epoch 1/1... Discriminator Loss: 0.6549... Generator Loss: 0.6776 Sum Loss: 1.3326
Epoch 1/1... Discriminator Loss: 0.7653... Generator Loss: 2.4428 Sum Loss: 3.2081
Epoch 1/1... Discriminator Loss: 0.2922... Generator Loss: 1.0404 Sum Loss: 1.3326
Epoch 1/1... Discriminator Loss: 1.4644... Generator Loss: 1.3771 Sum Loss: 2.8415
Epoch 1/1... Discriminator Loss: 0.9320... Generator Loss: 1.3788 Sum Loss: 2.3109
Epoch 1/1... Discriminator Loss: 0.5512... Generator Loss: 2.1622 Sum Loss: 2.7135
Epoch 1/1... Discriminator Loss: 1.9880... Generator Loss: 0.2229 Sum Loss: 2.2109
Epoch 1/1... Discriminator Loss: 0.2551... Generator Loss: 1.5177 Sum Loss: 1.7727
Epoch 1/1... Discriminator Loss: 2.4338... Generator Loss: 0.3196 Sum Loss: 2.7534
Epoch 1/1... Discriminator Loss: 1.1612... Generator Loss: 0.8323 Sum Loss: 1.9935
Epoch 1/1... Discriminator Loss: 1.1042... Generator Loss: 0.9647 Sum Loss: 2.0688
Epoch 1/1... Discriminator Loss: 1.0830... Generator Loss: 0.8020 Sum Loss: 1.8850
Epoch 1/1... Discriminator Loss: 0.7703... Generator Loss: 1.3334 Sum Loss: 2.1038
Epoch 1/1... Discriminator Loss: 1.3332... Generator Loss: 0.6487 Sum Loss: 1.9819
Epoch 1/1... Discriminator Loss: 0.9667... Generator Loss: 0.7142 Sum Loss: 1.6809
Epoch 1/1... Discriminator Loss: 0.7240... Generator Loss: 1.3511 Sum Loss: 2.0751
Epoch 1/1... Discriminator Loss: 1.9178... Generator Loss: 1.3677 Sum Loss: 3.2855
Epoch 1/1... Discriminator Loss: 1.5039... Generator Loss: 1.7078 Sum Loss: 3.2118
Epoch 1/1... Discriminator Loss: 1.3287... Generator Loss: 0.8958 Sum Loss: 2.2245
Epoch 1/1... Discriminator Loss: 1.3686... Generator Loss: 0.5795 Sum Loss: 1.9481
Epoch 1/1... Discriminator Loss: 1.3519... Generator Loss: 0.7035 Sum Loss: 2.0554
Epoch 1/1... Discriminator Loss: 1.4652... Generator Loss: 0.4504 Sum Loss: 1.9156
Epoch 1/1... Discriminator Loss: 0.9265... Generator Loss: 1.1828 Sum Loss: 2.1093
Epoch 1/1... Discriminator Loss: 1.2719... Generator Loss: 0.6901 Sum Loss: 1.9620
Epoch 1/1... Discriminator Loss: 0.5351... Generator Loss: 1.3832 Sum Loss: 1.9183
Epoch 1/1... Discriminator Loss: 0.7049... Generator Loss: 0.7421 Sum Loss: 1.4470
Epoch 1/1... Discriminator Loss: 1.7602... Generator Loss: 0.2588 Sum Loss: 2.0190
Epoch 1/1... Discriminator Loss: 1.0232... Generator Loss: 0.8521 Sum Loss: 1.8753
Epoch 1/1... Discriminator Loss: 1.1953... Generator Loss: 1.0838 Sum Loss: 2.2791
Epoch 1/1... Discriminator Loss: 1.4315... Generator Loss: 0.6143 Sum Loss: 2.0458
Epoch 1/1... Discriminator Loss: 0.9208... Generator Loss: 1.0317 Sum Loss: 1.9524
Epoch 1/1... Discriminator Loss: 0.9954... Generator Loss: 0.8257 Sum Loss: 1.8210
Epoch 1/1... Discriminator Loss: 0.6370... Generator Loss: 1.5504 Sum Loss: 2.1874
Epoch 1/1... Discriminator Loss: 0.9338... Generator Loss: 0.9792 Sum Loss: 1.9131
Epoch 1/1... Discriminator Loss: 0.8856... Generator Loss: 1.1267 Sum Loss: 2.0123
Epoch 1/1... Discriminator Loss: 1.3894... Generator Loss: 2.5156 Sum Loss: 3.9051
Epoch 1/1... Discriminator Loss: 0.8605... Generator Loss: 0.7586 Sum Loss: 1.6191
Epoch 1/1... Discriminator Loss: 0.5976... Generator Loss: 1.3206 Sum Loss: 1.9182
Epoch 1/1... Discriminator Loss: 1.0996... Generator Loss: 0.5022 Sum Loss: 1.6018
Epoch 1/1... Discriminator Loss: 0.7789... Generator Loss: 0.8169 Sum Loss: 1.5958
Epoch 1/1... Discriminator Loss: 1.4322... Generator Loss: 0.4468 Sum Loss: 1.8791
Epoch 1/1... Discriminator Loss: 1.2487... Generator Loss: 1.2383 Sum Loss: 2.4870
Epoch 1/1... Discriminator Loss: 1.1297... Generator Loss: 0.6102 Sum Loss: 1.7399
Epoch 1/1... Discriminator Loss: 1.7066... Generator Loss: 1.2162 Sum Loss: 2.9228
Epoch 1/1... Discriminator Loss: 1.4960... Generator Loss: 0.5457 Sum Loss: 2.0417
Epoch 1/1... Discriminator Loss: 1.2200... Generator Loss: 0.7394 Sum Loss: 1.9593
Epoch 1/1... Discriminator Loss: 1.1219... Generator Loss: 0.7830 Sum Loss: 1.9049
Epoch 1/1... Discriminator Loss: 2.1057... Generator Loss: 0.8209 Sum Loss: 2.9265
Epoch 1/1... Discriminator Loss: 1.6775... Generator Loss: 0.4476 Sum Loss: 2.1251
Epoch 1/1... Discriminator Loss: 0.5783... Generator Loss: 1.7707 Sum Loss: 2.3490
Epoch 1/1... Discriminator Loss: 1.1773... Generator Loss: 1.0698 Sum Loss: 2.2472
Epoch 1/1... Discriminator Loss: 1.2157... Generator Loss: 0.8502 Sum Loss: 2.0660
Epoch 1/1... Discriminator Loss: 1.6265... Generator Loss: 1.0802 Sum Loss: 2.7068
Epoch 1/1... Discriminator Loss: 0.7322... Generator Loss: 1.6627 Sum Loss: 2.3950
Epoch 1/1... Discriminator Loss: 1.5165... Generator Loss: 0.2059 Sum Loss: 1.7224
Epoch 1/1... Discriminator Loss: 1.0215... Generator Loss: 0.9833 Sum Loss: 2.0048
Epoch 1/1... Discriminator Loss: 1.1715... Generator Loss: 0.7980 Sum Loss: 1.9695
Epoch 1/1... Discriminator Loss: 0.5279... Generator Loss: 1.3574 Sum Loss: 1.8853
Epoch 1/1... Discriminator Loss: 0.9057... Generator Loss: 0.9940 Sum Loss: 1.8997
Epoch 1/1... Discriminator Loss: 1.0846... Generator Loss: 0.7087 Sum Loss: 1.7933
Epoch 1/1... Discriminator Loss: 0.9680... Generator Loss: 2.6364 Sum Loss: 3.6044
Epoch 1/1... Discriminator Loss: 0.9706... Generator Loss: 1.0976 Sum Loss: 2.0682
Epoch 1/1... Discriminator Loss: 0.6789... Generator Loss: 2.2794 Sum Loss: 2.9582
Epoch 1/1... Discriminator Loss: 1.2395... Generator Loss: 0.6733 Sum Loss: 1.9128
Epoch 1/1... Discriminator Loss: 0.7857... Generator Loss: 1.5378 Sum Loss: 2.3234
Epoch 1/1... Discriminator Loss: 1.5595... Generator Loss: 0.4711 Sum Loss: 2.0306
Epoch 1/1... Discriminator Loss: 1.4093... Generator Loss: 0.5682 Sum Loss: 1.9775
Epoch 1/1... Discriminator Loss: 1.0762... Generator Loss: 0.9607 Sum Loss: 2.0368
Epoch 1/1... Discriminator Loss: 1.0568... Generator Loss: 1.5098 Sum Loss: 2.5665
Epoch 1/1... Discriminator Loss: 1.1324... Generator Loss: 1.0495 Sum Loss: 2.1819
Epoch 1/1... Discriminator Loss: 1.1002... Generator Loss: 1.1035 Sum Loss: 2.2037
Epoch 1/1... Discriminator Loss: 1.1927... Generator Loss: 0.8702 Sum Loss: 2.0629
Epoch 1/1... Discriminator Loss: 1.0598... Generator Loss: 1.0598 Sum Loss: 2.1196
Epoch 1/1... Discriminator Loss: 2.3667... Generator Loss: 0.4259 Sum Loss: 2.7926
Epoch 1/1... Discriminator Loss: 1.0547... Generator Loss: 0.9097 Sum Loss: 1.9644
Epoch 1/1... Discriminator Loss: 1.2006... Generator Loss: 0.6930 Sum Loss: 1.8936
Epoch 1/1... Discriminator Loss: 1.0201... Generator Loss: 1.1581 Sum Loss: 2.1782
Epoch 1/1... Discriminator Loss: 1.5706... Generator Loss: 0.5441 Sum Loss: 2.1147
Epoch 1/1... Discriminator Loss: 0.8527... Generator Loss: 0.6340 Sum Loss: 1.4868
Epoch 1/1... Discriminator Loss: 1.2685... Generator Loss: 0.5599 Sum Loss: 1.8284
Epoch 1/1... Discriminator Loss: 1.0573... Generator Loss: 1.1698 Sum Loss: 2.2271
Epoch 1/1... Discriminator Loss: 0.8144... Generator Loss: 0.9948 Sum Loss: 1.8092
Epoch 1/1... Discriminator Loss: 1.1941... Generator Loss: 0.9652 Sum Loss: 2.1593
Epoch 1/1... Discriminator Loss: 1.2670... Generator Loss: 0.6227 Sum Loss: 1.8897
Epoch 1/1... Discriminator Loss: 1.1919... Generator Loss: 0.9037 Sum Loss: 2.0956
Epoch 1/1... Discriminator Loss: 1.0529... Generator Loss: 1.1714 Sum Loss: 2.2243
Epoch 1/1... Discriminator Loss: 1.2666... Generator Loss: 0.6900 Sum Loss: 1.9567
Epoch 1/1... Discriminator Loss: 1.4603... Generator Loss: 0.3269 Sum Loss: 1.7871
Epoch 1/1... Discriminator Loss: 1.2169... Generator Loss: 0.8622 Sum Loss: 2.0791
Epoch 1/1... Discriminator Loss: 1.4083... Generator Loss: 1.3192 Sum Loss: 2.7276
Epoch 1/1... Discriminator Loss: 1.1451... Generator Loss: 0.7058 Sum Loss: 1.8509
Epoch 1/1... Discriminator Loss: 1.2237... Generator Loss: 0.7051 Sum Loss: 1.9288
Epoch 1/1... Discriminator Loss: 1.2624... Generator Loss: 1.3081 Sum Loss: 2.5705
Epoch 1/1... Discriminator Loss: 1.5318... Generator Loss: 0.7812 Sum Loss: 2.3131
Epoch 1/1... Discriminator Loss: 1.8066... Generator Loss: 0.3961 Sum Loss: 2.2027
Epoch 1/1... Discriminator Loss: 0.9064... Generator Loss: 1.2262 Sum Loss: 2.1326
Epoch 1/1... Discriminator Loss: 0.8888... Generator Loss: 0.7410 Sum Loss: 1.6297
Epoch 1/1... Discriminator Loss: 1.3440... Generator Loss: 0.6376 Sum Loss: 1.9817
Epoch 1/1... Discriminator Loss: 1.1735... Generator Loss: 1.0453 Sum Loss: 2.2188
Epoch 1/1... Discriminator Loss: 0.9848... Generator Loss: 0.7259 Sum Loss: 1.7107
Epoch 1/1... Discriminator Loss: 0.8933... Generator Loss: 1.2679 Sum Loss: 2.1612
Epoch 1/1... Discriminator Loss: 1.3484... Generator Loss: 0.6005 Sum Loss: 1.9489
Epoch 1/1... Discriminator Loss: 1.4549... Generator Loss: 0.4961 Sum Loss: 1.9510
Epoch 1/1... Discriminator Loss: 1.7412... Generator Loss: 0.8904 Sum Loss: 2.6316
Epoch 1/1... Discriminator Loss: 1.1651... Generator Loss: 0.9437 Sum Loss: 2.1088
Epoch 1/1... Discriminator Loss: 1.3344... Generator Loss: 0.7179 Sum Loss: 2.0522
Epoch 1/1... Discriminator Loss: 1.4341... Generator Loss: 0.8153 Sum Loss: 2.2494
Epoch 1/1... Discriminator Loss: 1.3348... Generator Loss: 0.6809 Sum Loss: 2.0157
Epoch 1/1... Discriminator Loss: 1.1139... Generator Loss: 2.0353 Sum Loss: 3.1491
Epoch 1/1... Discriminator Loss: 1.2912... Generator Loss: 0.7709 Sum Loss: 2.0621
Epoch 1/1... Discriminator Loss: 1.1642... Generator Loss: 1.0383 Sum Loss: 2.2024
Epoch 1/1... Discriminator Loss: 1.4564... Generator Loss: 0.6181 Sum Loss: 2.0745
Epoch 1/1... Discriminator Loss: 1.2177... Generator Loss: 0.5253 Sum Loss: 1.7431
Epoch 1/1... Discriminator Loss: 0.4789... Generator Loss: 1.3559 Sum Loss: 1.8349
Epoch 1/1... Discriminator Loss: 0.5512... Generator Loss: 1.4780 Sum Loss: 2.0292
Epoch 1/1... Discriminator Loss: 1.2197... Generator Loss: 1.1544 Sum Loss: 2.3741
Epoch 1/1... Discriminator Loss: 1.4383... Generator Loss: 0.4498 Sum Loss: 1.8881
Epoch 1/1... Discriminator Loss: 1.4573... Generator Loss: 0.5255 Sum Loss: 1.9827
Epoch 1/1... Discriminator Loss: 1.3137... Generator Loss: 0.6620 Sum Loss: 1.9757
Epoch 1/1... Discriminator Loss: 1.3181... Generator Loss: 0.5476 Sum Loss: 1.8657
Epoch 1/1... Discriminator Loss: 1.4536... Generator Loss: 0.6174 Sum Loss: 2.0709
Epoch 1/1... Discriminator Loss: 1.4259... Generator Loss: 0.7033 Sum Loss: 2.1292
Epoch 1/1... Discriminator Loss: 0.9137... Generator Loss: 1.2519 Sum Loss: 2.1656
Epoch 1/1... Discriminator Loss: 1.2064... Generator Loss: 0.7428 Sum Loss: 1.9492
Epoch 1/1... Discriminator Loss: 1.3383... Generator Loss: 0.7335 Sum Loss: 2.0717
Epoch 1/1... Discriminator Loss: 1.4802... Generator Loss: 0.6618 Sum Loss: 2.1420
Epoch 1/1... Discriminator Loss: 1.1513... Generator Loss: 0.9886 Sum Loss: 2.1399
Epoch 1/1... Discriminator Loss: 1.4287... Generator Loss: 0.4092 Sum Loss: 1.8379
Epoch 1/1... Discriminator Loss: 1.2546... Generator Loss: 1.0681 Sum Loss: 2.3227
Epoch 1/1... Discriminator Loss: 1.5387... Generator Loss: 0.4944 Sum Loss: 2.0331
Epoch 1/1... Discriminator Loss: 1.5577... Generator Loss: 0.6964 Sum Loss: 2.2542
Epoch 1/1... Discriminator Loss: 1.5195... Generator Loss: 0.7226 Sum Loss: 2.2420
Epoch 1/1... Discriminator Loss: 1.6221... Generator Loss: 0.5715 Sum Loss: 2.1935
Epoch 1/1... Discriminator Loss: 1.2147... Generator Loss: 0.5930 Sum Loss: 1.8077
Epoch 1/1... Discriminator Loss: 1.3699... Generator Loss: 0.5419 Sum Loss: 1.9118
Epoch 1/1... Discriminator Loss: 1.1031... Generator Loss: 0.8084 Sum Loss: 1.9115
Epoch 1/1... Discriminator Loss: 1.3101... Generator Loss: 0.6940 Sum Loss: 2.0041
Epoch 1/1... Discriminator Loss: 1.3547... Generator Loss: 0.6579 Sum Loss: 2.0126
Epoch 1/1... Discriminator Loss: 1.4953... Generator Loss: 1.0258 Sum Loss: 2.5211
Epoch 1/1... Discriminator Loss: 1.4115... Generator Loss: 0.5587 Sum Loss: 1.9701
Epoch 1/1... Discriminator Loss: 1.3086... Generator Loss: 0.6953 Sum Loss: 2.0039
Epoch 1/1... Discriminator Loss: 1.0652... Generator Loss: 0.9117 Sum Loss: 1.9769
Epoch 1/1... Discriminator Loss: 1.1801... Generator Loss: 0.7788 Sum Loss: 1.9589
Epoch 1/1... Discriminator Loss: 1.3327... Generator Loss: 0.7049 Sum Loss: 2.0376
Epoch 1/1... Discriminator Loss: 0.9939... Generator Loss: 1.3446 Sum Loss: 2.3384
Epoch 1/1... Discriminator Loss: 0.7189... Generator Loss: 2.0749 Sum Loss: 2.7938
Epoch 1/1... Discriminator Loss: 1.2027... Generator Loss: 0.8790 Sum Loss: 2.0817
Epoch 1/1... Discriminator Loss: 1.7472... Generator Loss: 0.4259 Sum Loss: 2.1731
Epoch 1/1... Discriminator Loss: 1.3754... Generator Loss: 0.5688 Sum Loss: 1.9442
Epoch 1/1... Discriminator Loss: 1.7330... Generator Loss: 0.4910 Sum Loss: 2.2240
Epoch 1/1... Discriminator Loss: 1.1180... Generator Loss: 1.1214 Sum Loss: 2.2394
Epoch 1/1... Discriminator Loss: 0.9624... Generator Loss: 1.5797 Sum Loss: 2.5421
Epoch 1/1... Discriminator Loss: 1.3723... Generator Loss: 0.4745 Sum Loss: 1.8468
Epoch 1/1... Discriminator Loss: 0.9890... Generator Loss: 0.7223 Sum Loss: 1.7112
Epoch 1/1... Discriminator Loss: 1.0968... Generator Loss: 0.9998 Sum Loss: 2.0966
Epoch 1/1... Discriminator Loss: 1.0757... Generator Loss: 0.6638 Sum Loss: 1.7394
Epoch 1/1... Discriminator Loss: 1.3894... Generator Loss: 0.6336 Sum Loss: 2.0230
Epoch 1/1... Discriminator Loss: 1.5085... Generator Loss: 0.9288 Sum Loss: 2.4373
Epoch 1/1... Discriminator Loss: 1.1278... Generator Loss: 0.8509 Sum Loss: 1.9787
Epoch 1/1... Discriminator Loss: 1.2312... Generator Loss: 0.9776 Sum Loss: 2.2087
Epoch 1/1... Discriminator Loss: 1.5161... Generator Loss: 0.6287 Sum Loss: 2.1447
Epoch 1/1... Discriminator Loss: 1.5320... Generator Loss: 0.8152 Sum Loss: 2.3472
Epoch 1/1... Discriminator Loss: 1.4702... Generator Loss: 0.7340 Sum Loss: 2.2042
Epoch 1/1... Discriminator Loss: 0.8915... Generator Loss: 1.1310 Sum Loss: 2.0226
Epoch 1/1... Discriminator Loss: 1.1067... Generator Loss: 0.5474 Sum Loss: 1.6541
Epoch 1/1... Discriminator Loss: 1.5000... Generator Loss: 0.7309 Sum Loss: 2.2309
Epoch 1/1... Discriminator Loss: 1.3902... Generator Loss: 0.5777 Sum Loss: 1.9679
Epoch 1/1... Discriminator Loss: 1.2273... Generator Loss: 0.9987 Sum Loss: 2.2260
Epoch 1/1... Discriminator Loss: 1.2868... Generator Loss: 0.7875 Sum Loss: 2.0743
Epoch 1/1... Discriminator Loss: 1.2941... Generator Loss: 0.9382 Sum Loss: 2.2323
Epoch 1/1... Discriminator Loss: 1.4234... Generator Loss: 0.6461 Sum Loss: 2.0695
Epoch 1/1... Discriminator Loss: 1.1723... Generator Loss: 0.7410 Sum Loss: 1.9134
Epoch 1/1... Discriminator Loss: 1.3427... Generator Loss: 0.7677 Sum Loss: 2.1104
Epoch 1/1... Discriminator Loss: 1.3143... Generator Loss: 0.5776 Sum Loss: 1.8920
Epoch 1/1... Discriminator Loss: 1.0106... Generator Loss: 1.4517 Sum Loss: 2.4623
Epoch 1/1... Discriminator Loss: 0.9796... Generator Loss: 0.7908 Sum Loss: 1.7704
Epoch 1/1... Discriminator Loss: 0.8179... Generator Loss: 1.5791 Sum Loss: 2.3970
Epoch 1/1... Discriminator Loss: 1.5831... Generator Loss: 0.5233 Sum Loss: 2.1064
Epoch 1/1... Discriminator Loss: 1.2393... Generator Loss: 0.6565 Sum Loss: 1.8958
Epoch 1/1... Discriminator Loss: 0.9215... Generator Loss: 1.4938 Sum Loss: 2.4153
Epoch 1/1... Discriminator Loss: 1.2711... Generator Loss: 0.5667 Sum Loss: 1.8378
Epoch 1/1... Discriminator Loss: 1.0922... Generator Loss: 1.6620 Sum Loss: 2.7542
Epoch 1/1... Discriminator Loss: 0.7338... Generator Loss: 0.9234 Sum Loss: 1.6572
Epoch 1/1... Discriminator Loss: 1.1973... Generator Loss: 0.6502 Sum Loss: 1.8475
Epoch 1/1... Discriminator Loss: 1.4275... Generator Loss: 0.5514 Sum Loss: 1.9789
Epoch 1/1... Discriminator Loss: 0.9800... Generator Loss: 1.4289 Sum Loss: 2.4089
Epoch 1/1... Discriminator Loss: 1.1944... Generator Loss: 0.6980 Sum Loss: 1.8924
Epoch 1/1... Discriminator Loss: 1.2927... Generator Loss: 0.6412 Sum Loss: 1.9339
Epoch 1/1... Discriminator Loss: 1.3644... Generator Loss: 1.1120 Sum Loss: 2.4764
Epoch 1/1... Discriminator Loss: 1.5921... Generator Loss: 0.3792 Sum Loss: 1.9713
Epoch 1/1... Discriminator Loss: 1.1852... Generator Loss: 1.0336 Sum Loss: 2.2188
Epoch 1/1... Discriminator Loss: 1.1253... Generator Loss: 0.6510 Sum Loss: 1.7763
Epoch 1/1... Discriminator Loss: 1.5482... Generator Loss: 0.9039 Sum Loss: 2.4521
Epoch 1/1... Discriminator Loss: 1.3958... Generator Loss: 0.6159 Sum Loss: 2.0117
Epoch 1/1... Discriminator Loss: 1.3101... Generator Loss: 0.9258 Sum Loss: 2.2359
Epoch 1/1... Discriminator Loss: 1.0290... Generator Loss: 1.0791 Sum Loss: 2.1080
Epoch 1/1... Discriminator Loss: 1.3600... Generator Loss: 0.7318 Sum Loss: 2.0918
Epoch 1/1... Discriminator Loss: 1.4934... Generator Loss: 0.6972 Sum Loss: 2.1907
Epoch 1/1... Discriminator Loss: 1.2306... Generator Loss: 0.7828 Sum Loss: 2.0134
Epoch 1/1... Discriminator Loss: 2.1030... Generator Loss: 0.9789 Sum Loss: 3.0819
Epoch 1/1... Discriminator Loss: 1.3293... Generator Loss: 0.8898 Sum Loss: 2.2191
Epoch 1/1... Discriminator Loss: 1.3118... Generator Loss: 0.9456 Sum Loss: 2.2574
Epoch 1/1... Discriminator Loss: 1.4216... Generator Loss: 0.6546 Sum Loss: 2.0762
Epoch 1/1... Discriminator Loss: 1.3573... Generator Loss: 0.6344 Sum Loss: 1.9917
Epoch 1/1... Discriminator Loss: 0.8995... Generator Loss: 0.8884 Sum Loss: 1.7879
Epoch 1/1... Discriminator Loss: 1.4640... Generator Loss: 0.6565 Sum Loss: 2.1205
Epoch 1/1... Discriminator Loss: 1.2982... Generator Loss: 1.2922 Sum Loss: 2.5903
Epoch 1/1... Discriminator Loss: 1.1952... Generator Loss: 0.7133 Sum Loss: 1.9085
Epoch 1/1... Discriminator Loss: 1.3787... Generator Loss: 0.8778 Sum Loss: 2.2565
Epoch 1/1... Discriminator Loss: 1.1961... Generator Loss: 0.7810 Sum Loss: 1.9770
Epoch 1/1... Discriminator Loss: 1.5289... Generator Loss: 0.5238 Sum Loss: 2.0527
Epoch 1/1... Discriminator Loss: 1.3813... Generator Loss: 0.5448 Sum Loss: 1.9261
Epoch 1/1... Discriminator Loss: 1.0589... Generator Loss: 0.9682 Sum Loss: 2.0271

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as "dlnd_face_generation.ipynb" and save it as a HTML file under "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files in your submission.